A Case Study on Bagging, Boosting, and Basic Ensembles of Neural Networks for OCR
Abstract
We study the effectiveness of three neural network ensembles in improving OCR performance: (i) Basic, (ii) Bagging, and (iii) Boosting. Three random character degradation models are introduced in training individual networks in order to reduce the error correlation between individual networks and to improve the generalization ability of the neural networks. We compare the recognition accuracies of the three ensembles at various reject rates. An interesting discovery in our comparison is that although the Boosting ensemble is slightly more accurate than the Basic and Bagging ensembles at zero reject rate, the advantage of Boosting training over the Basic and Bagging ensembles quickly disappears as more patterns are rejected. Eventually the Basic and Bagging ensembles outperform the Boosting ensemble at high reject rates. An explanation of this phenomenon is provided in the paper. We also apply the optimal linear combiner (in the least-squares-error sense) to each of the three ensembles to capture the different error correlation characteristics of the three ensembles. We find that the optimal linear combiner is very effective in reducing mean square error, but is not necessarily as effective as a simple average method in reducing classification error.
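As a rough, generic sketch of the combination step described in the abstract (not the authors' exact implementation), the snippet below averages the per-class output vectors of several trained member networks and, alternatively, fits an optimal linear combiner by least squares against one-hot targets. The array shapes, function names, and the use of NumPy are illustrative assumptions.

```python
import numpy as np

# Assumed setup (illustrative): `member_outputs` has shape
# (n_members, n_samples, n_classes) and holds each trained network's
# per-class scores on a held-out set; `targets` is the one-hot label
# matrix of shape (n_samples, n_classes).

def simple_average(member_outputs):
    """Combine member networks by averaging their output vectors."""
    return member_outputs.mean(axis=0)                   # (n_samples, n_classes)

def fit_optimal_linear_combiner(member_outputs, targets):
    """Least-squares weights w minimizing ||sum_k w_k * f_k(x) - y||^2."""
    n_members = member_outputs.shape[0]
    # One column per member; rows enumerate (sample, class) pairs.
    A = member_outputs.reshape(n_members, -1).T          # (n_samples*n_classes, n_members)
    b = targets.reshape(-1)                              # (n_samples*n_classes,)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

def apply_linear_combiner(member_outputs, w):
    """Weighted combination of member outputs with the fitted weights."""
    return np.tensordot(w, member_outputs, axes=([0], [0]))  # (n_samples, n_classes)

# Classification takes the argmax of the combined output; rejection at a
# chosen rate can be implemented by thresholding the top combined score.
```

Because the least-squares weights minimize mean square error rather than classification error, the fitted combiner can reduce MSE without improving the argmax decision, which is consistent with the comparison reported in the abstract.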
Similar Resources
Ensemble strategies to build neural network to facilitate decision making
There are three major strategies for forming neural network ensembles. The simplest one is the Cross Validation strategy, in which all members are trained with the same training data. Bagging and boosting strategies produce perturbed samples from the training data. This paper provides an ideal model based on two important factors: activation function and number of neurons in the hidden layer and based u...
Comparing ensembles of decision trees and neural networks for one-day-ahead streamflow prediction
Ensemble learning methods have received remarkable attention in recent years and led to considerable advancement in the performance of regression and classification problems. Bagging and boosting are among the most popular ensemble learning techniques proposed to reduce the prediction error of learning machines. In this study, bagging and gradient boosting algorithms are incorporated in...
Popular Ensemble Methods: An Empirical Study
An ensemble consists of a set of individually trained classifiers (such as neural networks or decision trees) whose predictions are combined when classifying novel instances. Previous research has shown that an ensemble is often more accurate than any of the single classifiers in the ensemble. Bagging (Breiman, 1996c) and Boosting (Freund & Schapire, 1996; Schapire, 1990) are two relatively new...
Class-switching neural network ensembles
This article investigates the properties of class-switching ensembles composed of neural networks and compares them to class-switching ensembles of decision trees and to standard ensemble learning methods, such as bagging and boosting. In a class-switching ensemble, each learner is constructed using a modified version of the training data. This modification consists in switching the class label...
An Empirical Evaluation of Bagging and Boosting
An ensemble consists of a set of independently trained classifiers (such as neural networks or decision trees) whose predictions are combined when classifying novel instances. Previous research has shown that an ensemble as a whole is often more accurate than any of the single classifiers in the ensemble. Bagging (Breiman 1996a) and Boosting (Freund & Schapire 1996) are two relatively new but pop...
Publication year: 2004